- Markov property
- марковость (the Markov property; literally "Markovianity")
English-Russian technical dictionary.
Markov property — In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It was named after the Russian mathematician Andrey Markov.[1] A stochastic process has the Markov property if the… … Wikipedia
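For a discrete-time stochastic process, the memorylessness mentioned above is usually stated as a conditional-independence condition; the formula below is a standard textbook formulation with notation assumed here, not text from the truncated entry:

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).

In words, the distribution of the next state depends only on the present state, not on the earlier history of the process.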
Markov decision process — Markov decision processes (MDPs), named after Andrey Markov, provide a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for… … Wikipedia
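One common way to make this framework concrete (a standard formulation, with symbols chosen here rather than taken from the entry) is to specify an MDP as a tuple

(S, A, P, R, \gamma), \qquad P(s' \mid s, a), \qquad R(s, a),

where S is the state set, A the action set, P(s' \mid s, a) the probability of moving to state s' when action a is taken in state s, R(s, a) the expected reward, and \gamma \in [0, 1) an optional discount factor. The transition probabilities condition only on the current state and action, which is the Markov property in this decision-making setting.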
Markov perfect equilibrium — A solution concept in game theory; a subset (refinement) of subgame perfect equilibrium. Proposed by … Wikipedia
Markov chain — (Figure: a simple two-state Markov chain.) A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another among a finite or countable number of possible states. It is a random process characterized … Wikipedia
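A two-state chain such as the one in the figure caption can be summarized by a transition matrix; the numbers below are illustrative assumptions, not values from the entry:

P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}, \qquad \pi_{t+1} = \pi_t P,

where entry P_{ij} is the probability of moving from state i to state j, and \pi_t is the row vector of state probabilities after t steps. Each step of the evolution uses only the current distribution \pi_t, reflecting the Markov property.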
Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… … Wikipedia
Markov random field — A Markov random field, Markov network, or undirected graphical model is a set of random variables having a Markov property described by an undirected graph. A Markov random field is similar to a Bayesian network in its representation of dependencies. It… … Wikipedia
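For strictly positive distributions, the relationship between the undirected graph and the joint distribution can be sketched as a factorization over the graph's cliques (the Hammersley-Clifford form; the notation is a standard sketch rather than wording from the truncated entry):

P(x_1, \ldots, x_n) = \frac{1}{Z} \prod_{C \in \mathcal{C}} \phi_C(x_C),

where \mathcal{C} is the set of cliques of the graph, each \phi_C is a nonnegative potential function on the variables in clique C, and Z is the normalizing constant (partition function).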
Markov network — A Markov network, or Markov random field, is a model of the (full) joint probability distribution of a set $\mathcal{X}$ of random variables having the Markov property. A Markov network is similar to a Bayesian network in its representation of… … Wikipedia
Markov model — In probability theory, a Markov model is a stochastic model that assumes the Markov property. Generally, this assumption enables reasoning and computation with the model that would otherwise be intractable. … Wikipedia
Markov partition — A Markov partition is a tool used in dynamical systems theory, allowing the methods of symbolic dynamics to be applied to the study of hyperbolic systems. By using a Markov partition, the system can be made to resemble a discrete-time Markov… … Wikipedia
Markov blanket — In a Bayesian network, the Markov blanket of node A includes its parents, children and the other parents of all of its children. In machine learning, the Markov blanket for a node A in a Bayesian network is the set of nodes composed of A's… … Wikipedia
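Spelled out as a formula (a standard statement of the definition, with notation assumed here), conditioning on the blanket makes A independent of every other node:

P(A \mid \mathrm{MB}(A), B) = P(A \mid \mathrm{MB}(A)) \quad \text{for every node } B \notin \mathrm{MB}(A) \cup \{A\},

so the parents, the children, and the children's other parents together shield A from the rest of the network.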
Markov strategy — In game theory, a Markov strategy is one that does not depend at all on state variables that are functions of the history of the game, except those that affect payoffs. In other words, a player playing a Markovian strategy conditions his action… … Wikipedia